Leaky ReLU
Leaky Rectified Linear Unit
$f(x) = \begin{cases} x, & \text{if } x \geq 0 \\ \alpha x, & \text{if } x < 0 \end{cases}$
α is a small positive constant, typically set to 0.01.
Unlike ReLU, which outputs zero for all negative inputs, Leaky ReLU lets a small fraction (αx) of each negative input "leak" through, so the gradient for negative inputs stays small but nonzero.
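As a minimal sketch of the definition above (plain NumPy, independent of any framework; the file name and helper are illustrative):
code:leaky_relu_numpy.py
import numpy as np
def leaky_relu(x, alpha=0.01):
    # piecewise: x for x >= 0, alpha * x for x < 0
    return np.where(x >= 0, x, alpha * x)
print(leaky_relu(np.array([-2.0, -0.5, 0.0, 3.0])))  # [-0.02, -0.005, 0.0, 3.0]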
Implementation examples
code:PyTorch.py
import torch.nn as nn
# negative_slope corresponds to α above; 0.01 is also PyTorch's default
leaky_relu = nn.LeakyReLU(negative_slope=0.01)
code:TensorFlow/Keras.py
from tensorflow.keras.layers import LeakyReLU
# alpha corresponds to α above; note the Keras default is 0.3, not 0.01
leaky_relu = LeakyReLU(alpha=0.01)
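Note: the two frameworks disagree on the default slope (PyTorch's nn.LeakyReLU defaults to negative_slope=0.01, Keras's LeakyReLU to alpha=0.3), so passing the value explicitly as above keeps the two implementations consistent.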